Regularization Methods for Sum of Squares Relaxations in Large Scale Polynomial Optimization
Abstract
We study how to solve sum of squares (SOS) and Lasserre relaxations for large-scale polynomial optimization. When interior-point methods are used, typically only small or moderately large problems can be solved. This paper proposes regularization-type methods that can solve significantly larger problems. We first describe these methods for general conic semidefinite optimization, and then apply them to large-scale polynomial optimization. Their efficiency is demonstrated by extensive numerical computations. In particular, a general dense quartic polynomial optimization problem with 100 variables can be solved on a regular computer, which is almost impossible with prior existing SOS solvers.
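The SOS relaxations discussed above rest on the Gram-matrix representation: a polynomial p is SOS exactly when p(x) = m(x)^T Q m(x) for some positive semidefinite matrix Q, where m(x) is a vector of monomials. The following is a minimal sketch of that check, with Q hand-picked for the toy polynomial p(x) = x^4 + 2x^2 + 1 = (x^2 + 1)^2; in practice a solver (such as the regularization methods studied here) would compute Q by solving an SDP.

```python
import numpy as np

# Monomial basis m(x) = [1, x, x^2]; p is SOS iff p(x) = m(x)^T Q m(x)
# for some positive semidefinite Q.  This Q is hand-chosen for the toy
# polynomial p(x) = x^4 + 2x^2 + 1 = (x^2 + 1)^2, not solver output.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Q must be PSD (all eigenvalues nonnegative) for p to be SOS.
assert np.linalg.eigvalsh(Q).min() >= -1e-12

# Recover the coefficients of p from Q: the coefficient of x^k is the
# sum of Q[i, j] over basis degrees with i + j = k.
coeffs = [sum(Q[i, j] for i in range(3) for j in range(3) if i + j == k)
          for k in range(5)]
assert coeffs == [1.0, 0.0, 2.0, 0.0, 1.0]  # p(x) = 1 + 2x^2 + x^4
```

The Gram matrix is what the SDP relaxation optimizes over; the interior-point bottleneck comes from the size of Q growing combinatorially with the number of variables and the degree.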
Similar References
Regularization Methods for SDP Relaxations in Large-Scale Polynomial Optimization
We study how to solve semidefinite programming (SDP) relaxations for large-scale polynomial optimization. When interior-point methods are used, typically only small or moderately large problems can be solved. This paper studies regularization methods for solving polynomial optimization problems. We describe these methods for semidefinite optimization with block structures, and then apply them...
Solving polynomial least squares problems via semidefinite programming relaxations
A polynomial optimization problem whose objective function is represented as a sum of positive and even powers of polynomials, called a polynomial least squares problem, is considered. Methods to transform a polynomial least squares problem to polynomial semidefinite programs to reduce degrees of the polynomials are discussed. Computational efficiency of solving the original polynomial least sq...
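The degree-reduction idea in this excerpt can be illustrated numerically: a quartic term q(x)^4 can be replaced by y^2 together with the substitution y = q(x)^2, so the objective involves only squares of lower-degree expressions. This is a toy sketch with hand-picked polynomials p and q, not the paper's actual transformation.

```python
import numpy as np

# Toy polynomial least squares objective: f(x) = p(x)^2 + q(x)^4,
# a sum of positive even powers of polynomials.
p = lambda x: x**2 - 1.0   # hypothetical small polynomial
q = lambda x: x - 0.5      # hypothetical small polynomial

def f_original(x):
    return p(x)**2 + q(x)**4          # the degree-4 power appears directly

def f_lifted(x):
    y = q(x)**2                        # lifting variable y = q(x)^2
    return p(x)**2 + y**2              # only squares remain

# The two formulations agree everywhere on a test grid.
xs = np.linspace(-2.0, 2.0, 41)
assert np.allclose([f_original(x) for x in xs], [f_lifted(x) for x in xs])
```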
Summary of Ph.D. Dissertation: Global Optimization of Polynomial Functions and Applications
where f(x) is a real multivariate polynomial in x ∈ Rn and S is a feasible set defined by polynomial equalities or inequalities. In this thesis, we do not have any convexity/concavity assumptions on f(x) or S. The goal is to find the global minimum and global minimizers if any. Polynomial optimization of form (1.1) is quite general in practical applications. Many NP-hard problems like max cut, ...
Sparse SOS Relaxations for Minimizing Functions that are Summations of Small Polynomials
This paper discusses how to find the global minimum of functions that are summations of small polynomials (“small” means involving a small number of variables). Some sparse sum of squares (SOS) techniques are proposed. We compare their computational complexity and lower bounds with prior SOS relaxations. Under certain conditions, we also discuss how to extract the global minimizers from these s...
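The payoff of sparsity can be quantified with basis counts: the dense Gram matrix for a quartic in n variables is indexed by all monomials of degree at most 2, of which there are C(n+2, 2), while each small term only needs a Gram block over its own few variables. The sketch below compares the two for a hypothetical chain-structured objective with 99 terms of 2 variables each.

```python
from math import comb

# Number of monomials of degree <= d in n variables: C(n + d, d).
def basis_size(n, d):
    return comb(n + d, d)

n, d = 100, 2                 # dense quartic in 100 variables
dense = basis_size(n, d)      # one large Gram matrix: 5151 x 5151

# Hypothetical sparse structure: 99 small terms, each in 2 variables,
# so each SOS block is only basis_size(2, 2) = 6 x 6.
sparse_block = basis_size(2, d)

assert dense == 5151
assert sparse_block == 6
```

Many small PSD blocks are far cheaper for an SDP solver than one 5151 x 5151 block, which is the structural reason sparse SOS relaxations scale better.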
On the Hardest Problem Formulations for the 0/1 Lasserre Hierarchy
The Lasserre/Sum-of-Squares (SoS) hierarchy is a systematic procedure for constructing a sequence of increasingly tight semidefinite relaxations. It is known that the hierarchy converges to the 0/1 polytope in n levels and captures the convex relaxations used in the best available approximation algorithms for a wide variety of optimization problems. In this paper we characterize the set of 0/1 ...